Web Survey Bibliography
In the following five chapters, several methodological innovations in panel surveys are evaluated. In each chapter, one of the methods discussed above for studying and correcting measurement errors is used to examine how these innovations affect survey errors and/or the substantive conclusions drawn from the survey data. The techniques discussed in the different chapters all build on one or more of the basic methods, but are described and explored in far more detail.

In Chapter 3, propensity score matching is used to study the effects of a mixed-mode respondent recruitment strategy for a survey. It shows how matching can be used to separate nonresponse error from measurement error in a mixed telephone and Internet survey. Separating the two makes it possible to study the differences between the samples that persist after correcting for nonresponse error: the mode effect.

In Chapter 4, we turn to the technique of Dependent Interviewing (DI). Different versions of DI are experimentally compared and evaluated using a quasi-simplex model. This chapter shows how DI and the extent of measurement error in a survey question on income affect the reliability coefficient.

Chapter 5 further explores the use of Dependent Interviewing in panel surveys, focusing on the effect DI has on substantive estimates based on income questions. In addition, a validation study using the same income questions sheds light on how DI affects survey estimates.

Chapter 6 focuses on change in attitude questions in a population that experiences a period of life changes. A mixed-method study combining longitudinal survey data with qualitative interviews shows how attitudes change over time. Among a group of first-year psychology students, not only do levels of attitudes towards their studies change; the concept of interest itself changes as well. The chapter shows how the meaning of study motivation for students itself changes over time.

The final chapter focuses on panel attrition. Recent advances in mixture Structural Equation Modeling are used to describe the process of attrition in a panel study with monthly measurements. The chapter shows how different archetypes of respondents drop out of a study in different ways and for different reasons, and concludes by showing how each group of attriters affects longitudinal nonresponse error in a different way.
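The matching step described for Chapter 3 can be sketched in a few lines: estimate each respondent's propensity to end up in the web (rather than telephone) sample from covariates, then pair each web respondent with the telephone respondent whose propensity score is closest, so that any remaining difference in answers reflects the mode effect. The sketch below is purely illustrative: the data are simulated, the covariates and the built-in mode effect of 0.5 are invented, and none of it comes from the study itself.

```python
import math
import random

random.seed(42)

# Invented synthetic data, loosely mimicking the mixed telephone/web setting:
# each unit has two standardized covariates, a mode indicator
# (1 = web, 0 = telephone), and an answer with a built-in mode effect of 0.5.
def simulate(n=400):
    units = []
    for _ in range(n):
        za = random.gauss(0, 1)   # standardized age
        ze = random.gauss(0, 1)   # standardized education
        p_web = 1 / (1 + math.exp(-(-0.9 * za + 0.6 * ze)))
        web = 1 if random.random() < p_web else 0
        answer = 0.6 * za + 0.3 * ze + 0.5 * web + random.gauss(0, 1)
        units.append((za, ze, web, answer))
    return units

def fit_propensity(units, steps=500, lr=0.5):
    """Logistic regression of mode on covariates, fit by gradient descent."""
    b0 = b1 = b2 = 0.0
    n = len(units)
    for _ in range(steps):
        g0 = g1 = g2 = 0.0
        for za, ze, web, _ in units:
            p = 1 / (1 + math.exp(-(b0 + b1 * za + b2 * ze)))
            g0 += p - web
            g1 += (p - web) * za
            g2 += (p - web) * ze
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
        b2 -= lr * g2 / n
    return b0, b1, b2

def estimated_mode_effect(units):
    b0, b1, b2 = fit_propensity(units)
    score = lambda u: 1 / (1 + math.exp(-(b0 + b1 * u[0] + b2 * u[1])))
    web = [u for u in units if u[2] == 1]
    tel = [u for u in units if u[2] == 0]
    # Nearest-neighbour matching (with replacement) on the propensity score:
    # each web respondent is paired with the telephone respondent who was
    # about equally likely to have ended up in the web sample.
    diffs = [w[3] - min(tel, key=lambda t: abs(score(t) - score(w)))[3]
             for w in web]
    return sum(diffs) / len(diffs)

print(f"matched mode-effect estimate: {estimated_mode_effect(simulate()):.2f}")
```

With the covariates balanced by matching, the remaining mean difference between matched web and telephone answers recovers the simulated mode effect of roughly 0.5; in a real application the covariates would be the recruitment and background variables observed for both samples.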
- Detecting Satisficing In Online Surveys: What we found; 2012; Salifu, S.
- Challenges of assessing the quality of a prerecruited probability-based panel of internet users in...; 2012; Struminskaya, B., Kaczmirek, L.
- Assessing Cross-National Equivalence of Measures of Xenophobia: Evidence from Probing in Web Surveys; 2012; Behr, D., Braun, M., Kaczmirek, L.
- Impact and the Research Excellence Framework: New challenges for universities; 2012; Grant, J.
- Impact of Fixed Choice Design on Blockmodeling Outcomes; 2012; Znidarsic, A.
- Panel Conditioning in Online Survey Panels: Problems of Increased Sophistication and Decreased Engagement...; 2012; Adams, A. N., Atkeson, L. R., Karp, J. A.
- Efficiency of Different Recruitment Strategies for Web Panels; 2012; Hansen, K. M., Pedersen, R. T.
- Understanding Mode Effects between Mobile Web and Mobile SMS Surveys; 2012; Poduska, B., Johnson, E. P.
- Paper-and-Pencil versus Web Administration of a Student Satisfaction Survey; 2012; Bowen, C.-C.
- Nonresponse and Online Student Evaluations of Teaching: Understanding the Influence of Salience...; 2012; Adams, M. J. D., Umbach, P. D.
- Disfluencies and Gaze Aversion in Unreliable Responses to Survey Questions; 2012; Schober, M. F., Conrad, F. G., Dijkstra, W., Ongena, Y. P.
- Use of Paradata in a Responsive Design Framework to Manage a Field Data Collection; 2012; Wagner, J., West, B. T., Kirgis, N., Lepkowski, J. M., Axinn, W., Kruger-Ndiaye, S.
- Recruiting A Probability Sample For An Online Panel: Effects Of Contact Mode, Incentives, And Information...; 2012; Scherpenzeel, A., Toepoel, V.
- Do Questions about Watching Internet Pornography Make People Watch Internet Pornography? A Comparison...; 2012; Peter, J., Valkenburg, P. M.
- “I think I know what you did last summer”: Improving data quality in panel surveys; 2012; Lugtig, P. J.
- An Initial Look at Non-Response and Attrition in Understanding Society; 2012; Lynn, P., Burton, J., Kaminska, O., Knies, G., Nandi, A.
- Understanding Society Innovation Panel Wave 4: Results from Methodological Experiments; 2012; Burton, J., Budd, S., Kaminska, O., Uhrig, S. C. N., Brown, M., Calderwood, L.
- The Propensity of Older Respondents to Participate in a General Purpose Survey; 2012; Lynn, P.
- Mode-Switch Protocols: How a Seemingly Small Design Difference can affect Attrition Rates and Attrition...; 2012; Lynn, P.
- Does Giving People Their Preferred Survey Mode Actually Increase Survey Participation Rates?; 2012; Olson, K., Smyth, J. D., Wood, H.
- Adaptive web sampling in ecology; 2012; Thompson, S. K.
- Deep Data: Qualitative Approaches to E-Research in the Digital Age; 2012; Salmons, J.
- Going online with a face-to-face household panel: Initial results from an experiment on the UK Household...; 2012; Jaeckle, A., Lynn, P., Burton, J.
- Opportunities and Challenges for the Digital Researcher; 2012; Blank, G., Morey, Y.
- Measures of Data Quality Across the RDD Frames; 2012; Lavrakas, P. J.
- Reliable Online Social Network Data Collection; 2012; Abdesslem, F. B., Parris, I., Henderson, T.
- Statisticians don’t like non-probability; 2012; Murphy, J.
- Diasporas on the web: new networks, new methodologies; 2012; Crush, J., Eberhardt, C., Caesar, M., Chikanda, A., Pendleton, W., Hill, A.
- Not by the Book: Facebook as a Sampling Frame; 2012; Brickman Bhutta, C.
- Survey Quality Evaluation for Business Surveys; 2012; Biemer, P. P.
- Respondent-driven sampling; 2012; Schonlau, M., Liebau, E.
- Online Data Collection in the Agro-Food Sector; 2012; Biffignandi, S., Artaz, R.
- Collecting data electronically from enterprises – searching for the right approach; 2012; Keating, J., Portillo, S.
- Tailoring the design of e-questionnaires to the response process: About audit trails and other methods...; 2012; Morren, M., Snijkers, G.
- Predicting potential respondents' decision to participate in web surveys; 2012; Fang, J., Wen, C.
- Comparing Ranking Techniques in Web Surveys; 2012; Blasius, J.
- Design of CAWI Instruments for Social Surveys; 2012; Blanke, K.
- Web Survey Software; 2012; Berzelak, N., Vehovar, V., Slavec, A.
- Surveying general population: What types of experiments are further needed?; 2012; Vehovar, V., Berzelak, N.
- Psychometric properties of an internet administered version of the Marlowe-Crowne Social Desirability...; 2012; Vesteinsdottir, V., Reips, U.-D., Joinson, A. N., Porsdottir, F.
- Enhancing Web Surveys With New HTML5 Input Types; 2012; Funke, F.
- Mobile Survey Participation Rates in Commercial Market Research: A Meta-Analysis; 2012; Bosnjak, M., Poggio, T., Becker, K. R., Funke, F., Wachenfeld, A., Fischer, B.
- Research design for studying online communities with web surveys; 2012; Petrovcic, A., Petric, G., Lozar Manfreda, K.
- “What a waste of time!” vs “Why not participate?” On sentiments by business...; 2012; Torres van Grinsven, V., Snijkers, G., Daas, P.
- Case study: Respondent perspective on survey response; 2012; Jarrett, C.
- Effect of different stimulus on data quality in online panels; 2012; Zagar, S., Lozar Manfreda, K.
- The German Internet Panel: First Results from the Recruitment Phases; 2012; Blom, A. G.
- Panel retention rate and data quality: experimental results drawing on Reciprocity design; 2012; Biffignandi, S., Artaz, R.
- Analysis of coverage bias for the implementation of web surveys in Spain; 2012; de Pedraza, P., Serrano, F.
- Web panels in Slovenia; 2012; Lenar, J., Vehovar, V.